Conditions Under Which Conditional Independence and Scoring Methods Lead to Identical Selection of Bayesian Network Models

Author

  • Robert G. Cowell
Abstract

It is often stated in papers tackling the task of selecting a Bayesian network structure from data that there are two distinct approaches: (i) apply conditional independence tests when testing for the presence or otherwise of edges; (ii) search the model space using a scoring metric. Here I argue that for complete data and a given node ordering this division is largely a myth, by showing that cross-entropy methods for checking conditional independence are mathematically identical to methods based upon discriminating between models by their overall goodness-of-fit logarithmic scores.
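The identity claimed in the abstract can be checked numerically. A minimal sketch (not Cowell's own derivation or code, and using hypothetical synthetic binary data): the difference in maximised log-likelihood between the model in which X has parents {Y, Z} and the one in which X has parent {Z} equals N times the empirical conditional mutual information I(X; Y | Z), which is the cross-entropy test statistic.

```python
import numpy as np
from collections import Counter

# Hypothetical synthetic data over three binary variables, ordering Z, Y, X.
rng = np.random.default_rng(0)
N = 500
z = rng.integers(0, 2, N)
y = (z + rng.integers(0, 2, N)) % 2
x = (y + rng.integers(0, 2, N)) % 2
data = list(zip(x.tolist(), y.tolist(), z.tolist()))

def loglik(pairs):
    """Maximised log-likelihood of X given its parent configuration."""
    joint = Counter(pairs)              # counts n(x, parents)
    par = Counter(p for _, p in pairs)  # counts n(parents)
    return sum(c * np.log(c / par[p]) for (_, p), c in joint.items())

# Log-score difference: model with edge Y -> X minus model without it,
# both keeping Z as a parent of X.
delta = loglik([(xi, (yi, zi)) for xi, yi, zi in data]) \
      - loglik([(xi, (zi,)) for xi, yi, zi in data])

# Empirical conditional mutual information I(X; Y | Z) from the same counts.
nxyz = Counter(data)
nxz = Counter((xi, zi) for xi, _, zi in data)
nyz = Counter((yi, zi) for _, yi, zi in data)
nz = Counter(zi for _, _, zi in data)
n = len(data)
cmi = sum(c / n * np.log(c * nz[zi] / (nxz[(xi, zi)] * nyz[(yi, zi)]))
          for (xi, yi, zi), c in nxyz.items())

print(abs(n * cmi - delta))  # zero up to floating-point rounding
```

The agreement is exact rather than approximate: expanding both expressions in terms of the contingency-table counts shows they are the same sum term by term, which is the algebraic core of the paper's argument.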


Similar resources

Comparison of Neural Network Models, Vector Auto Regression (VAR), Bayesian Vector-Autoregressive (BVAR), Generalized Auto Regressive Conditional Heteroskedasticity (GARCH) Process and Time Series in Forecasting Inflation in Iran

This paper has two aims. The first is forecasting inflation in Iran using macroeconomic variables data in Iran (inflation rate, liquidity, GDP, prices of imported goods and exchange rates), and the second is comparing the performance of forecasting vector auto regression (VAR), Bayesian Vector-Autoregressive (BVAR), GARCH, time series and neural network models by which Iran's inflation is for...


Non-intuitive conditional independence facts hold in models of network data

Many social scientists and researchers across a wide range of fields focus on analyzing a single causal dependency or a conditional model of some outcome variable. However, to reason about interventions or conditional independence, it is useful to construct a joint model of a domain. Researchers in computer science, statistics, and philosophy have developed representations (e.g., Bayesian netwo...


Selection Bias Correction in Supervised Learning with Importance Weight (Learning probabilistic graphical models and selection bias correction)

In the theory of supervised learning, the identical-distribution assumption, i.e. that the training and the test samples are drawn from the same probability distribution, plays a crucial role. Unfortunately, this essential assumption is often violated in the presence of selection bias. Under such conditions, the standard supervised learning frameworks may suffer a significant bias. In this thesis, we use the impo...


Bayesian Network Learning with Discrete Case-Control Data

We address the problem of learning Bayesian networks from discrete, unmatched case-control data using specialized conditional independence tests. Those tests can also be used for learning other types of graphical models or for feature selection. We also propose a post-processing method that can be applied in conjunction with any Bayesian network learning algorithm. In simulations we show that o...


Learning Bayesian Networks

We examine Bayesian methods for learning Bayesian networks from a combination of prior knowledge and statistical data. In particular, we develop simple methods for generating priors for Bayesian-network parameters. Our work is a generalization of previous work that has concentrated on Bayesian networks containing only discrete variables and (to a lesser extent) on Gaussian networks. We introduc...



Journal title:

Volume   Issue 

Pages  -

Publication date 2001